Are Theories of Imagery Theories of Imagination? An Active Perception Approach to Conscious Mental Content

Nigel J.T. Thomas


Page 5

Source: http://cogprints.org/5018/1/im-im-cp.htm

The information processing approach to perception originated in computer vision research, where it was long taken as axiomatic that the task of a machine vision system was to process the signal produced by its TV-camera sensor into an internal descriptive model of the world before the camera's lens. However, this orthodoxy has recently been challenged by the so-called active vision (or active perception) approach (Bajcsy, 1988; Ballard, 1991; Bajcsy & Campos, 1992; Blake & Yuille, 1992; Aloimonos, 1992, 1993; Swain & Stricker, 1993; Landy, Maloney, & Pavel, 1996; Scassellati, 1998). This approach is clearly influenced by Gibson, but may be seen as part of the wider movement toward understanding cognitive systems in terms of their environmental situatedness (Clancey, 1997). Active vision systems are typically robotic rather than merely computational, and instead of being used to build a comprehensive inner model of its surroundings, the robot's perceptual capacities are simply used to obtain whatever specific pieces of information are currently necessary for the ongoing control of its behavior in the world.

Visual sensory data is analyzed purposefully in order to answer specific queries posed by the [robot] observer. The observer constantly adjusts its vantage point in order to uncover the piece of information that is immediately most pressing. (Blake & Yuille, 1992; emphasis added).

"Activity" should here be understood in a strong, literal sense that goes beyond and subsumes views that invoke it merely in the sense of top-down control of information processing within the computational architecture of a vision system (or a brain). Not only does the robot actually move its TV-camera sensors to point at whatever it presently needs to know about; different specialized algorithms are purposefully applied to extract the specific sort of information (e.g. shape, distance, velocity, texture) that the system currently requires. Often, these algorithms themselves may involve purposefully moving the sensor relative to what is being examined (Aloimonos, Weiss, & Bandyopadhyay, 1988), actively picking up information (c.f. Gibson, 1966, 1979) rather than passively transducing.

Although they have not (to my knowledge) applied their ideas to imagery, active vision researchers seem to have revived the general outlook on perception formerly taken by PA theorists. With certain more neuroscientifically oriented thinkers also moving in a similar direction (Churchland, Ramachandran, & Sejnowski, 1994; Cotterill, 1997), perhaps the time is ripe to revive PA imagery theory itself.

2.3.1 Sketch of a Perceptual Activity Theory

In consideration of the underdeveloped state of extant versions of PA theory, and the failure of most Gibsonians and Situated Cognition researchers to apply their insights to imagery, in this section I give my own tentative synthesis. Conjectures about matters of detail are intended as merely illustrative. I do not aim to provide a finished, detailed theory, but, rather, some concrete sense of how such a theory might work.

In the spirit of the proceduralist approach to memory (Kolers & Roediger, 1984), perceptual learning is not viewed as a matter of storing descriptions (or pictures) of perceived scenes or objects, but as the continual updating and refining of procedures (or "schemata" [Neisser, 1976]) that specify how to direct our attention most effectively in particular situations: how to efficiently examine and explore, and thus interpret, a scene or object of a certain type (Stark & Ellis, 1981). Through such processes of controlled perceptual exploration we collect the information that takes us from a vague, preattentive appreciation that something is out there, to a detailed understanding of just what it is. We engage in "a rapid sequence of microperceptions and microreactions, almost simultaneous as far as consciousness is concerned" (Damasio & Damasio, 1992), and it is through this attentive process of searching out the distinctive features and feature complexes of the things before us, that we come to recognize and categorize them, to perceive them as whatever they may be.

On this view, no end-product of perception, no inner picture or description is ever created. No thing in the brain is the percept or image. Rather, perceptual experience consists in the ongoing activity of schema-guided perceptual exploration of the environment. Imagery is experienced when a schema that is not directly relevant to the exploration of the current environment is allowed at least partial control of the exploratory apparatus. We imagine, say, a cat, by going through (some of) the motions of examining something and finding that it is a cat, even though there is no cat (and perhaps nothing relevant at all) there to be examined. Imagining a cat is seeing nothing-in-particular as a cat (Ishiguro, 1967).
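
A minimal sketch of the distinction just drawn, under wholly invented assumptions (the probe function, the feature list, and the "cat schema" are placeholders of my own): the same schema that directs a sequence of attentional tests on a real scene can also be run when there is nothing relevant there to test, and on the present account that second case is imagery.

    # Toy sketch: a schema is a procedure that directs attention; it never
    # deposits a finished picture or description anywhere. Invented example.

    def probe(scene, feature):
        """One attentional test on the environment (or on nothing at all)."""
        return None if scene is None else scene.get(feature, False)

    def cat_schema(scene):
        """Go through (some of) the motions of finding that something is a cat."""
        for feature in ("pointed ears", "whiskers", "fur", "tail"):
            yield feature, probe(scene, feature)   # the experience is this activity

    real_cat = {"pointed ears": True, "whiskers": True, "fur": True, "tail": True}

    print("perceiving a cat:", list(cat_schema(real_cat)))
    print("imagining a cat: ", list(cat_schema(None)))    # same schema, no cat there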

Instead of viewing perception as a matter of the inflow of information into the brain, and eventually into the properly mental data format, PA theory, like active vision robotics, views it as a continual process of active interrogation of the environment:

impulses in the optic nerve fibres at each moment of scanning a scene are the answers, in code, to the "questions" that had been asked at the previous moment. . . . [What] goes on in the retina is not like the recording of a "picture", but the detection of a series of items, which are reported to the brain (Young, 1978).

Each new "question" takes its cue from the answers to the previous ones. The stored procedures direct our sensory systems to make what amount to tests and measurements on our environment, and the results obtained influence not only our ongoing general behavior but feed back to influence the course of the ongoing perceptual interrogation itself. These procedures or schemata are best understood not as fixed, linear sets of instructions, but as having a branching network or tree structure (Rumelhart, 1980), so that the results of each test can feed back to influence which branch will be followed, and thus which further tests will be performed:

Information gathering is a dynamic process that responds at once to events in the visual world, to the system's evolving understanding of that world, and to changing requirements of the vision task (Swain & Stricker, 1993).
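
To make the branching structure concrete, here is one possible rendering of a schema as a small decision tree (the particular tests and categories are invented for illustration and carry no theoretical weight): each node names a test put to the environment, and the answer obtained selects the branch, and hence the further tests, that follow.

    # Illustrative only: a schema as a branching network of tests, where the
    # result of each test feeds back to choose the next.

    SCHEMA = {
        "test": "is it animate?",
        True:  {"test": "does it have four legs?",
                True:  {"category": "quadruped"},
                False: {"category": "person or bird"}},
        False: {"test": "is it rigid?",
                True:  {"category": "tool or furniture"},
                False: {"category": "cloth or foliage"}},
    }

    def interrogate(node, oracle):
        """Walk the tree; 'oracle' answers each question (it stands in for the world)."""
        while "category" not in node:
            answer = oracle(node["test"])   # one test or measurement on the environment
            node = node[answer]             # the answer cues the next question
        return node["category"]

    # A made-up environment in which everything probed is animate and four-legged:
    print(interrogate(SCHEMA, oracle=lambda question: True))    # -> quadruped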

This view of perception as the making of active tests and measurements is applicable to all sense modes, including taste and smell (Halpern, 1983), but perhaps may be explained best by considering the haptic mode. Mere passive touch tells us little, but by actively exploring an object with our hands we can find out a great deal. Our hands incorporate not only sensory transducers, but musculature which, under central control, moves them in appropriate ways. Lederman and Klatzky (1990) have observed a number of specific haptic tests that subjects may apply to an object to reveal different properties. For example, hefting something tells us about its weight, rubbing it reveals texture, enclosing it in the hand or running the fingers around the contours provides shape information, and squeezing reveals compressibility. We might, then, think of the hand, together with the neural structures that control hefting and that analyze the afferent signals it generates, as a sort of instrument for estimating weight. Together with other neural structures, the hand becomes part of an instrument for detecting shape, texture, and so on. I will use the term "perceptual instrument" to mean a complex of physiological structures that is capable of actively testing for the presence or amplitude of some specific type of environmental property. The notion is similar to what has elsewhere been called a "smart sensor" or "smart perceptual mechanism" (Runeson, 1977; Burt, 1988). (I use "instrument" rather than the more conventional "detector" or "sensor" to stress the requirement for active deployment under cognitive control.)
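
As one way of picturing a "perceptual instrument" (all the code below is an invented illustration; only the pairing of exploratory movements with revealed properties comes from the Lederman and Klatzky findings summarized above), the hand plus its controlling and analyzing machinery can be treated as a small library of property-specific test procedures, each deployed only when its property is wanted:

    # Illustrative sketch: the hand, together with the neural structures that
    # drive and interpret a given exploratory movement, treated as one
    # "perceptual instrument" per property. Invented names throughout.

    class Hand:
        """Stand-in for the sensorimotor apparatus being deployed."""
        def move(self, action, obj):
            return obj.get(action, "nothing noticed")   # pretend afferent signal

    EXPLORATORY_MOVEMENT = {       # property sought -> movement that reveals it
        "weight": "heft",          # (after Lederman & Klatzky, 1990)
        "texture": "rub",
        "shape": "enclose",
        "compressibility": "squeeze",
    }

    def perceptual_instrument(hand, obj, prop):
        """Actively test the object for one specific property."""
        action = EXPLORATORY_MOVEMENT[prop]
        return hand.move(action, obj)        # movement plus analysis of its result

    stone = {"heft": "heavy", "rub": "rough", "enclose": "rounded", "squeeze": "hard"}
    hand = Hand()
    for wanted in ("weight", "texture"):     # test only for what is needed right now
        print(wanted, "->", perceptual_instrument(hand, stone, wanted))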

The properties tested may sometimes be complex, relational ones. In robotic active vision systems the TV-camera sensor may function at one moment as part of an instrument for determining shape from shading, at another as part of an instrument for estimating relative distance from parallax, and so on, according to which particular algorithm currently has control of it (control of both its motion and the analysis of its output). Ballard (1991) calls this capacity of a single sensor to function as needed as a component of different perceptual instruments "sensor fission". The PA theorist will hold that human vision, indeed perception in general, may be usefully thought of in a similar way: We do not so much have five general-purpose senses as a large array of anatomically overlapping perceptual instruments, a capacious "box of tricks" (Ramachandran, 1990).
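
"Sensor fission" might be pictured as follows (again a schematic of my own; the shape-from-shading and distance-from-parallax routines are empty placeholders, not the algorithms the active vision literature actually uses): one and the same sensor object is handed, in turn, to different controlling procedures, each of which both moves it and interprets its output in its own way.

    # Schematic illustration of "sensor fission" (Ballard, 1991): one sensor,
    # several perceptual instruments that borrow it in turn. Placeholder logic.

    class Sensor:
        def point(self, direction):
            self.direction = direction
        def read(self):
            return 0.5                                  # dummy signal

    def shape_from_shading(sensor):
        """One instrument: controls the sensor and reads its output as shape."""
        sensor.point("at the object")
        return "convex" if sensor.read() > 0.4 else "concave"

    def distance_from_parallax(sensor):
        """Another instrument: moves the same sensor and reads its output as depth."""
        sensor.point("from the left");  left = sensor.read()
        sensor.point("from the right"); right = sensor.read()
        return f"disparity {abs(left - right):.2f} (placeholder)"

    eye = Sensor()
    for instrument in (shape_from_shading, distance_from_parallax):
        print(instrument.__name__, "->", instrument(eye))   # same sensor, different instrument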

